A day in daylight

Dashboard by:
Affiliations

Johannes Zauner

Technical University of Munich & Max Planck Institute for Biological Cybernetics, Germany

Resshaya Roobini Murukesu

TUMCREATE Ltd. (Singapore)

Manuel Spitschan

Technical University of Munich & Max Planck Institute for Biological Cybernetics, Germany

Preface

This document contains the analysis and results for the event A day in daylight, where people from around the world measured a complete day of light exposure on (and around) 22 September 2025.

Note

Note that this script is optimized to generate plot outputs and objects for a dashboard. Thus, the direct outputs of the script may look distorted in places.

Importing data

We first set up all packages needed for the analysis.

Next we import the survey data. Data were collected with REDCap, and there is an import script to load the data in.
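The import script itself is not shown here; as a rough sketch, records could be pulled directly from REDCap with the REDCapR package (the URI and token below are placeholders):

```r
library(REDCapR)

# Pull the survey records via the REDCap API; URI and token are placeholders
survey_data <- redcap_read(
  redcap_uri = "https://redcap.example.org/api/",
  token      = Sys.getenv("REDCAP_TOKEN")
)$data
```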

Connecting light data with survey data

First, we collect a list of available data sets. As we need to compare them to the device IDs in the survey, we require only the file names without path or extension.
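The comparison set can be built by stripping path and extension from the file names; the data folder below is an assumption:

```r
# List the available light-logger files and reduce them to bare names
light_files <- list.files("data/light", full.names = TRUE)  # assumed location
file_ids    <- tools::file_path_sans_ext(basename(light_files))
```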

Next we check which devices are declared in the survey.

[1] 0
[1] TRUE

No entries are duplicated, and the survey device IDs match the names of the light data files.
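The two console outputs above can come from checks along these lines (`survey$device_id` and `file_ids` are assumed names):

```r
# Count duplicated device ids in the survey; 0 means none
sum(duplicated(survey$device_id))

# TRUE if survey device ids and light-file names contain the same elements
setequal(survey$device_id, file_ids)
```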

Device and location information

Next, we need to get the participants’ time zones and coordinates. For this, let’s reduce the complexity of the dataset and clean the data.
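Where time zones are not recorded directly, they can be derived from the coordinates, for example with the lutz package (an assumption about the tooling; `latitude`/`longitude` are assumed column names):

```r
library(dplyr)
library(lutz)

# Look up the IANA time zone for each participant's coordinates
participants <- participants |>
  mutate(tz = tz_lookup_coords(lat = latitude, lon = longitude,
                               method = "accurate"))
```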

Record ID 31 did not finish the post-survey, so we lack data on that device and consequently remove it. Furthermore, Record ID 30 only has data from well outside the time frame of interest. Record ID 40 does not have sensible data, likely due to a dead battery, and will also be removed.

We also have to clean up the city and country data, as well as latitude and longitude. We do this separately and load the data back in. The manual entries for locations had to be cleaned; this was done via the OpenAI API. The results were stored in the file data/cleaned/places.csv. Uncomment the code cell below to recreate the process. Details of the outcome may vary, however.

First overview

The following code cells use the data imported so far to create some descriptive plots about the sample.

Reading layer `combined-shapefile-with-oceans-now' from data source 
  `/Users/zauner/Documents/Arbeit/12-TUM/2025_ADayInDaylight_Data/data/tz_now/combined-shapefile-with-oceans-now.shp' 
  using driver `ESRI Shapefile'
Simple feature collection with 92 features and 1 field
Geometry type: MULTIPOLYGON
Dimension:     XY
Bounding box:  xmin: -180 ymin: -90 xmax: 180 ymax: 90
Geodetic CRS:  WGS 84

Import wearable data

Next, we import the light data. Two device types are in use, ActLumus and ActTrust, and we need to import them separately, as they require different import functions. A device_id with four digits indicates an ActLumus device, whereas seven digits indicate an ActTrust. We add a column to the data indicating the Type of device in use. We also make sure that the spelling matches the supported_devices() list from LightLogR. Then we construct file paths for all files.
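A sketch of this device split and import, assuming LightLogR's import interface; the folder and file extension are placeholders:

```r
library(LightLogR)
library(dplyr)

devices <- devices |>
  mutate(
    # four digits: ActLumus; seven digits: ActTrust
    Type = if_else(nchar(device_id) == 4, "ActLumus", "ActTrust"),
    # construct file paths; folder and extension are placeholders
    path = file.path("data/light", paste0(device_id, ".txt"))
  )

# LightLogR provides one import function per supported device type
actlumus_data <- import$ActLumus(filter(devices, Type == "ActLumus")$path)
acttrust_data <- import$ActTrust(filter(devices, Type == "ActTrust")$path)
```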

[1] TRUE TRUE

We end up with one dataset per row entry. As the two ActTrust files do not contain a melanopic EDI column, we will use the photopic illuminance column LIGHT instead. As only two participants are affected by this shortcoming, it will not unduly influence the results.

Further, the dataset from Malaysia had a device malfunction on 22 September and only recorded from 23 September onwards. As differences between dates are minimal and very few datasets come from that region, we will not dismiss the dataset but rather shift its data by one day. We also need to shift the data a further 8 hours backwards, as that dataset was stored in UTC time (for some reason).
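The shift itself is a one-liner per affected dataset; the participant id below is a placeholder:

```r
library(dplyr)
library(lubridate)

data <- data |>
  mutate(Datetime = if_else(
    Id == "MY_device",              # placeholder id for the Malaysian dataset
    Datetime - days(1) - hours(8),  # one day plus the 8-hour UTC offset
    Datetime
  ))
```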

Light data

Cleaning light data

In this section we will prepare the light data through the following steps:

  • resampling data to 5 minute intervals
  • filling in missing data with explicit gaps
  • removing data that does not fall between 2025-09-21 10:00:00 UTC and 2025-09-23 12:00:00 UTC, which contains all times where 22 September occurs somewhere on the planet

Next, we add a secondary Datetime column that runs on UTC time.
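The three cleaning steps plus the secondary UTC column can be sketched with LightLogR's pipeline functions (argument details may differ from the actual script):

```r
library(LightLogR)
library(dplyr)
library(lubridate)

data_clean <- data |>
  aggregate_Datetime(unit = "5 mins") |>   # resample to 5-minute intervals
  gap_handler() |>                         # make missing intervals explicit gaps
  filter_Datetime(start = "2025-09-21 10:00:00",
                  end   = "2025-09-23 12:00:00") |>
  mutate(Datetime_utc = with_tz(Datetime, tzone = "UTC"))
```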

Visualizing light data

Now we can visualize the whole dataset - first by combining all datasets.
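LightLogR ships plotting helpers that cover this kind of overview; a minimal sketch, assuming default column names:

```r
library(LightLogR)

# Coverage overview: one bar per participant, showing when data exist
data_clean |> gg_overview()

# Daily light-exposure profiles across the combined dataset
data_clean |> gg_day()
```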

Events

Cleaning events

In this section we deal with the activity logs - first by filtering them out of the dataset, then by selecting the relevant aspects.

Next, we condense columns that can be expressed as one. We also drop the .factor extension, as all duplicates are now removed. Finally, we simplify entries.
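For illustration, the condensing could look like this (all column names are assumptions):

```r
library(dplyr)
library(stringr)

events <- events |>
  # merge mutually exclusive setting columns into one
  mutate(setting = coalesce(setting_indoor, setting_outdoor, setting_mixed)) |>
  select(-setting_indoor, -setting_outdoor, -setting_mixed) |>
  # drop the ".factor" suffix that REDCap exports append
  rename_with(~ str_remove(.x, fixed(".factor")))
```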

Summaries

In this section we will calculate some summary statistics regarding events.

Activity logging (by-participant level)
Characteristic N = 47¹
Log entries 38 (9-80)
Mean duration between log entries 1.69 hours (0.53 hours-3.92 hours)
Total time span of log entries 2.09 days (1.14 days-4.13 days)
¹ Mean (Min-Max)
Characteristic N = 1,763¹
Indoor settings
    Home 661 (70%)
    Workplace 126 (13%)
    Education 17 (1.8%)
    Commercial 84 (8.9%)
    Healthcare 3 (0.3%)
    Leisure 23 (2.4%)
    Other 27 (2.9%)
Outdoor settings
    Home 51 (15%)
    Workplace 24 (6.8%)
    Education 8 (2.3%)
    Commercial 19 (5.4%)
    Healthcare 0 (0%)
    Leisure 83 (24%)
    Other 166 (47%)
Outdoor-Indoor mixed settings
    Covered patio or terrace 20 (8.1%)
    Semi-open corridor/gallery 11 (4.5%)
    Balcony 3 (1.2%)
    Veranda 5 (2.0%)
    Atrium 1 (0.4%)
    Transportation (car/taxi) 138 (56%)
    Transportation (bus or commuter/regional rail) 31 (13%)
    Transportation (long-distance train) 6 (2.4%)
    Transportation (underground, subway) 6 (2.4%)
    Transportation (airplane) 6 (2.4%)
    Transportation (bike) 5 (2.0%)
    Transportation (ferry) 1 (0.4%)
    Other 14 (5.7%)
¹ n (%)
Characteristic N = 1,763¹
Specific indoor setting
    Bathroom 86 (9.6%)
    Bedroom 114 (13%)
    Break/lounge area 6 (0.7%)
    Classroom 4 (0.4%)
    Conference/meeting room 18 (2.0%)
    Convenience store/supermarket 23 (2.6%)
    Corridor 10 (1.1%)
    Dentist 1 (0.1%)
    Drug store 1 (0.1%)
    Home office 85 (9.5%)
    Kitchen 100 (11%)
    Laboratory 2 (0.2%)
    Lecture hall 2 (0.2%)
    Library 2 (0.2%)
    Living room 213 (24%)
    Office supply store 1 (0.1%)
    Open-plan office area 16 (1.8%)
    Other 80 (8.9%)
    Parking garage 10 (1.1%)
    Personal workspace/desk 72 (8.0%)
    Restaurant/cafeteria/bakery 32 (3.6%)
    Shopping mall 16 (1.8%)
    Studio 1 (0.1%)
¹ n (%)
Characteristic N = 1,763¹
Wear type: Are you wearing the light logger at the moment?
    Wear 1,535 (87%)
    Non-wear 132 (7.5%)
    Bedtime 88 (5.0%)
Activity
    Sedentary 728 (47%)
    Light activity 693 (45%)
    Moderate activity 103 (6.7%)
    High-intensity activity 11 (0.7%)
Non-wear wearable position
    Dark stationary 13 (9.8%)
    Dark mobile 22 (17%)
    Stationary 88 (67%)
    Other 9 (6.8%)
Nightstand wearable measurement direction
    Upward 78 (89%)
    Downward 5 (5.7%)
    Other 5 (5.7%)
¹ n (%)
Characteristic N = 1,763¹
Are you alone or with others?
    Alone 1,028 (59%)
    With others 728 (41%)
    Missing 7
Select the setting
    Indoors 941 (61%)
    Outdoors 351 (23%)
    Mixed 247 (16%)
    Missing 224
Daylight conditions
    Away from window 363 (25%)
    Direct sunlight 216 (15%)
    Near a window 664 (46%)
    Shade / cloudy 210 (14%)
    Missing 310
Electric lighting conditions
    Dim light 242 (19%)
    Lights off 356 (28%)
    Lights on 658 (52%)
    Missing 507
Autonomy
    Yes 825 (58%)
    Partly 172 (12%)
    No 428 (30%)
    Missing 338
¹ n (%)
Mean daily duration in condition
Condition Daily duration Percent
Bed 5h 59m 25%
Indoors 10h 6m 42%
Mixed 1h 56m 8%
Outdoors 1h 31m 6%
Non-wear 2h 59m 12%
1h 25m 6%
Sum 1d 100%

Combining Events with light data

In this step, we expand the light measurements with the event data. To this end, we need to specify start and end times for each log entry, and thus for each state.
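One way to do this is to close each log entry at the start of the next one and then attach states with a non-equi join (dplyr ≥ 1.1; `state` is an assumed column). LightLogR's sc2interval() and interval2state() helpers cover the same step and may be the route the actual script takes.

```r
library(dplyr)
library(lubridate)

event_spans <- events |>
  group_by(Id) |>
  arrange(Datetime, .by_group = TRUE) |>
  mutate(start = Datetime,
         # each entry ends where the next begins; the last one is capped
         end = lead(Datetime, default = last(Datetime) + hours(1))) |>
  ungroup() |>
  select(Id, state, start, end)

# label every light measurement with the state active at that time
data_states <- data_clean |>
  left_join(event_spans, by = join_by(Id, between(Datetime, start, end)))
```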

State analysis

In this section, we look at melanopic EDI for various conditions.

First, we create a function that takes a variable and a filter value for that column, and creates a histogram on that basis.
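A sketch of such a histogram helper, using tidy evaluation so the column can be passed unquoted (MEDI as the melanopic EDI column is an assumption):

```r
library(dplyr)
library(ggplot2)

histogram_plot <- function(data, var, value) {
  data |>
    filter({{ var }} == value) |>
    ggplot(aes(x = MEDI)) +
    geom_histogram(bins = 30) +
    scale_x_log10() +            # light levels span orders of magnitude
    labs(x = "Melanopic EDI (lx)", title = value)
}

# e.g. histogram_plot(data_states, setting, "Outdoors")
```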

The second plot function takes the same inputs, and creates a doubleplot showing when these instances occur.

Next, we create a function that also takes a variable, and creates a table containing durations, episodes, and some key metrics.

Lastly, a function to create the actual table: it takes the outputs of the duration_tibble() and histogram_plot() functions and creates a gt HTML table from them.

Time above threshold

In this section we calculate the time above threshold for the single day of 22 September 2025 across latitude and country.
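LightLogR's duration_above_threshold() metric fits here; the grouping columns are assumptions:

```r
library(LightLogR)
library(dplyr)
library(lubridate)

# Time above 250 lx melanopic EDI on 22 September, per participant
tat_250 <- data_states |>
  filter(as_date(Datetime) == as_date("2025-09-22")) |>
  group_by(Id, country, latitude) |>
  summarize(
    TAT_250 = duration_above_threshold(MEDI, Datetime, threshold = 250),
    .groups = "drop"
  )
```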

Global perspectives

In this section we look at the global distribution of melanopic EDI, time above 250 lx, and the dose of light.

Shade data

In this section we calculate day and night times around the globe. As we want to use this information for a looped visualization, we will set a slightly different cutoff and only collect 48 hours.
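Day and night episodes can be computed from sunrise and sunset per location, for example with the suncalc package (an assumption about the tooling; `places` with latitude/longitude columns is assumed):

```r
library(suncalc)
library(tidyr)
library(dplyr)

# Cross every participant location with the three relevant dates
grid <- expand_grid(
  date = as.Date(c("2025-09-21", "2025-09-22", "2025-09-23")),
  places |> select(lat = latitude, lon = longitude)
)

sun_times <- getSunlightTimes(data = as.data.frame(grid),
                              keep = c("sunrise", "sunset"),
                              tz = "UTC")
```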

Melanopic EDI

Light data

Plot

Race to the highest light dose

In this section we create an animated race to the highest light dose.
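A cumulative-dose race can be sketched with gganimate's transition_reveal() (an assumption about the animation approach; the dose here integrates 5-minute melanopic EDI samples into lx·h):

```r
library(dplyr)
library(ggplot2)
library(gganimate)

race_data <- data_states |>
  group_by(Id) |>
  arrange(Datetime_utc, .by_group = TRUE) |>
  mutate(dose = cumsum(MEDI) * 5 / 60) |>   # lx·h from 5-minute samples
  ungroup()

race_plot <- ggplot(race_data,
                    aes(Datetime_utc, dose, group = Id, colour = Id)) +
  geom_line() +
  transition_reveal(Datetime_utc)

# animate(race_plot, fps = 10, end_pause = 20)
```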

Export for dashboard

In this section we export key data for the dashboard.